Distributed Optimization of Convex Sum of Non-Convex Functions

Authors

  • Shripad Gade
  • Nitin H. Vaidya
Abstract

We present a distributed solution to minimizing a convex function that is the sum of several non-convex functions. Each non-convex function is privately held by an agent, and the agents communicate with their neighbors to form a network. We show that the coupled consensus and projected gradient descent algorithm proposed in [1] can optimize such a convex sum of non-convex functions under an additional gradient Lipschitzness assumption. We further discuss how this analysis can be applied to improve privacy in distributed optimization.
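To make the setting concrete, the following is a minimal Python sketch of a coupled consensus and projected gradient descent iteration; it is not the exact algorithm or analysis of [1]. The quadratic local functions, the ring-network mixing matrix W, the box constraint handled by the hypothetical helper project_box, and the diminishing step size are all illustrative assumptions. Each agent first averages its iterate with its neighbors' (consensus) and then takes a projected gradient step on its own, possibly non-convex, local function; the sum of the local functions is convex by construction.

import numpy as np

np.random.seed(0)
n_agents, dim = 4, 2

# Illustrative local costs f_i(x) = 0.5 x^T A_i x + b_i^T x (A_i symmetric).
# A[0] and A[1] are indefinite, so f_0 and f_1 are non-convex, but the sum
# of all A_i is positive definite, so the sum of the f_i is convex.
A = [np.array([[ 2.0,  0.0], [ 0.0, -0.5]]),
     np.array([[-0.5,  0.0], [ 0.0,  2.0]]),
     np.array([[ 1.0,  0.3], [ 0.3,  1.0]]),
     np.array([[ 1.0, -0.3], [-0.3,  1.0]])]
b = [np.random.randn(dim) for _ in range(n_agents)]

def local_grad(i, x):
    # Gradient of agent i's local cost at x.
    return A[i] @ x + b[i]

def project_box(x, lo=-10.0, hi=10.0):
    # Euclidean projection onto the convex set X = [lo, hi]^dim (assumed here).
    return np.clip(x, lo, hi)

# Doubly stochastic mixing matrix for a 4-agent ring network (an assumption).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = np.random.randn(n_agents, dim)        # one local iterate per agent
for t in range(1, 2001):
    step = 1.0 / (t + 1)                  # diminishing step size
    v = W @ x                             # consensus: average with neighbors
    for i in range(n_agents):
        # Projected gradient step on agent i's own (possibly non-convex) cost.
        x[i] = project_box(v[i] - step * local_grad(i, v[i]))

print(np.round(x, 3))                     # agent iterates end up nearly equal

This toy instance mirrors the structure named in the abstract: individual local functions are non-convex, but their sum is convex, and the coupled consensus/projected-gradient iteration drives all agents toward a common minimizer of that sum.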


Similar articles

A generalized form of the Hermite-Hadamard-Fejer type inequalities involving fractional integral for co-ordinated convex functions

Recently, a general class of Hermite-Hadamard-Fejer type inequalities for convex functions was studied in [H. Budak, Results in Mathematics, 74:29, March 2019]. In this paper, we establish a generalization of the Hermite-Hadamard-Fejer inequality for fractional integrals based on co-ordinated convex functions. Our results generalize and improve several inequalities obtained in earlier studies.
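For background, the classical weighted (Fejer) form of the inequality for a convex function f on [a, b] and a nonnegative, integrable weight w symmetric about (a+b)/2 reads as follows; this is the standard statement, not the generalized fractional, co-ordinated version established in the paper above.

% Classical Hermite-Hadamard-Fejer inequality:
% f : [a,b] -> R convex, w : [a,b] -> R nonnegative, integrable,
% and symmetric about (a+b)/2.
\[
  f\!\left(\frac{a+b}{2}\right)\int_a^b w(x)\,dx
  \;\le\; \int_a^b f(x)\,w(x)\,dx
  \;\le\; \frac{f(a)+f(b)}{2}\int_a^b w(x)\,dx .
\]
% Taking w \equiv 1 recovers the Hermite-Hadamard inequality.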

SIZE AND GEOMETRY OPTIMIZATION OF TRUSS STRUCTURES USING THE COMBINATION OF DNA COMPUTING ALGORITHM AND GENERALIZED CONVEX APPROXIMATION METHOD

In recent years, the optimization of truss structures has attracted attention due to their many applications, simple structure, and rapid analysis. The DNA computing algorithm is a non-gradient-based method derived from numerical modeling of the computing behavior of DNA-based computers, i.e., machines with DNA memory known as molecular computers. The DNA computing algorithm works based on collective intelli...

Katyusha X: Practical Momentum Method for Stochastic Sum-of-Nonconvex Optimization

The problem of minimizing sum-of-nonconvex functions (i.e., convex functions that are the average of non-convex ones) is becoming increasingly important in machine learning, and is the core machinery for PCA, SVD, regularized Newton's method, accelerated non-convex optimization, and more. We show how to provably obtain an accelerated stochastic algorithm for minimizing sum-of-nonconvex functions, by...
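The problem class in question can be written as follows; this is the standard sum-of-nonconvex formulation implied by the description above, with smoothness of the components stated as a typical assumption rather than a detail quoted from the paper.

\[
  \min_{x \in \mathbb{R}^d} \; f(x) := \frac{1}{n}\sum_{i=1}^{n} f_i(x),
\]
% where f is convex, but each component f_i may be non-convex
% (typically assumed smooth, i.e. with Lipschitz-continuous gradient).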

Distributed Non-Convex First-Order Optimization and Information Processing: Lower Complexity Bounds and Rate Optimal Algorithms

We consider a class of distributed non-convex optimization problems that often arise in modern distributed signal and information processing, in which a number of agents connected by a network G collectively optimize a sum of smooth (possibly non-convex) local objective functions. We address the following fundamental question: for a class of unconstrained non-convex problems with Lipschitz continuo...
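For context, such a distributed problem is commonly written in consensus form; this is a standard reformulation that is valid when the network G is connected, not a formulation quoted from the paper.

\[
  \min_{x \in \mathbb{R}^d} \sum_{i=1}^{m} f_i(x)
  \quad\Longleftrightarrow\quad
  \min_{x_1,\dots,x_m \in \mathbb{R}^d} \sum_{i=1}^{m} f_i(x_i)
  \quad \text{s.t.} \quad x_i = x_j \ \text{for every edge } (i,j) \text{ of } G,
\]
% where agent i holds the smooth (possibly non-convex) local objective f_i.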

Singular values of convex functions of matrices

Let $A_{i}, B_{i}, X_{i}$, $i=1,\dots,m$, be $n$-by-$n$ matrices such that $\sum_{i=1}^{m}\left\vert A_{i}\right\vert^{2}$ and $\sum_{i=1}^{m}\left\vert B_{i}\right\vert^{2}$ are nonzero matrices and each $X_{i}$ is positive semidefinite. It is shown that if $f$ is a nonnegative increasing convex function on $\left[0,\infty\right)$ satisfying $f\left(0\right)=0$, then $$2s_{j}\left( f\left( \fra...


Journal:
  • CoRR

Volume: abs/1608.05401   Issue: -

Pages: -

Publication date: 2016